19 research outputs found

    Binding and Normalization of Binary Sparse Distributed Representations by Context-Dependent Thinning

    Distributed representations have often been criticized as inappropriate for encoding data with a complex structure. However, Plate's Holographic Reduced Representations and Kanerva's Binary Spatter Codes are recent schemes that allow on-the-fly encoding of nested compositional structures by real-valued or dense binary vectors of fixed dimensionality. In this paper we consider the Context-Dependent Thinning procedures, which were developed for the representation of complex hierarchical items in the architecture of Associative-Projective Neural Networks. These procedures provide binding of items represented by sparse binary codevectors (with a low probability of 1s). Such an encoding is biologically plausible and allows a high storage capacity of the distributed associative memory in which the codevectors may be stored. In contrast to known binding procedures, Context-Dependent Thinning preserves the same low density (or sparseness) of the bound codevector for a varying number of component codevectors. Moreover, a bound codevector is not only similar to other codevectors with similar components (as in other schemes), but it is also similar to the component codevectors themselves. This allows the similarity of structures to be estimated simply by the overlap of their codevectors, without retrieval of the component codevectors, and it also makes retrieval of the component codevectors easy. Examples of algorithmic and neural-network implementations of the thinning procedures are considered. We also present representation examples for various types of nested structured data (propositions using role-filler and predicate-argument representation schemes, trees, directed acyclic graphs) using sparse codevectors of fixed dimension. Such representations may provide a fruitful alternative to the symbolic representations of traditional AI, as well as to localist and microfeature-based connectionist representations.
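
    As a rough illustration of how such a binding operation can work, the following sketch is written in the spirit of the additive thinning procedure under simplifying assumptions: codevectors are NumPy boolean arrays with a small fraction of 1s, the "thinning context" is a fixed set of random permutations, and the vector dimension, density, and number of permutations are illustrative choices rather than values from the paper.

```python
import numpy as np

def random_sparse_vector(n, density, rng):
    """Random binary codevector with roughly a `density` fraction of 1s."""
    return rng.random(n) < density

def cdt_bind(components, density, n_perms=50, seed=0):
    """Context-Dependent Thinning, additive variant (illustrative sketch only).

    The component codevectors are superimposed by elementwise OR, and the
    superposition is then thinned: only those 1s are kept that also appear in
    at least one of several fixed random permutations of the superposition,
    stopping once the result reaches roughly the density of a single component.
    """
    n = components[0].size
    z = np.zeros(n, dtype=bool)
    for c in components:
        z |= c                               # superposition of the components

    rng = np.random.default_rng(seed)        # fixed permutations act as the thinning context
    thinned = np.zeros(n, dtype=bool)
    for _ in range(n_perms):
        p = rng.permutation(n)
        thinned |= z & z[p]                  # conjunction with a permuted copy of z
        if thinned.sum() >= density * n:     # stop near the target density
            break
    return thinned

rng = np.random.default_rng(1)
n, density = 10_000, 0.01
a, b, c = (random_sparse_vector(n, density, rng) for _ in range(3))
bound = cdt_bind([a, b, c], density)

# The bound codevector keeps roughly the density of one component, and since
# its 1s are a subset of the superposition's 1s, it overlaps each component.
print(bound.sum(), (bound & a).sum(), (bound & b).sum(), (bound & c).sum())
```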

    On Handling Replay Attacks in Intrusion Detection Systems

    We propose a method for detecting and analyzing so-called replay attacks in intrusion detection systems, in which an intruder adds a small number of hostile actions to a recorded session of a legitimate user or process and replays this session back to the system. The proposed approach can be applied if an automata-based model is used to describe the behavior of active entities in a computer system.
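
    As a hypothetical illustration of the automata-based setting (not the detection method of the paper itself), the sketch below describes legitimate behavior as a set of allowed state transitions and flags the actions of a replayed session that fall outside this model; the state names, action names, and transition table are invented for the example.

```python
# Hypothetical automaton-based behavior model: legitimate behavior is a set of
# allowed (state, action) -> next_state transitions, and a replayed session
# with injected hostile actions is flagged wherever it leaves the model.
ALLOWED = {
    ("logged_out", "login"): "shell",
    ("shell", "list_files"): "shell",
    ("shell", "edit_file"): "shell",
    ("shell", "logout"): "logged_out",
}

def check_session(actions, start="logged_out"):
    """Return the indices of actions that violate the behavior model."""
    state, violations = start, []
    for i, action in enumerate(actions):
        next_state = ALLOWED.get((state, action))
        if next_state is None:
            violations.append(i)      # action not allowed in the current state
        else:
            state = next_state
    return violations

# A recorded legitimate session with one injected hostile action.
session = ["login", "list_files", "dump_passwords", "edit_file", "logout"]
print(check_session(session))         # -> [2]
```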

    Training a Linear Neural Network with a Stable LSP Solution for Jamming Cancellation

    Two jamming cancellation algorithms are developed based on a stable solution of the least squares problem (LSP) provided by regularization. The algorithms rely on filtered singular value decomposition (SVD) and on modifications of the Greville formula, and both allow an efficient hardware implementation. Test results on artificial data modeling difficult real-world situations are also provided.
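
    One standard way to obtain such a stable regularized least-squares solution with a filtered SVD is Tikhonov-style filtering of the singular values. The sketch below illustrates only that generic idea on synthetic data; it is not the paper's jamming-cancellation algorithm, and the matrix sizes, noise level, and regularization parameter are arbitrary assumptions.

```python
import numpy as np

def filtered_svd_solve(A, b, lam=1e-2):
    """Regularized least-squares solution of min ||A w - b|| via a filtered SVD.

    Each singular value s is replaced by the Tikhonov filter s / (s**2 + lam),
    which damps the small, noise-dominated singular values that make the plain
    pseudo-inverse solution unstable.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    filtered = s / (s**2 + lam)            # filtered inverse singular values
    return Vt.T @ (filtered * (U.T @ b))

# Synthetic, ill-conditioned example (illustrative only).
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 8))
A[:, 7] = A[:, 6] + 1e-6 * rng.standard_normal(200)   # nearly dependent columns
w_true = rng.standard_normal(8)
b = A @ w_true + 0.01 * rng.standard_normal(200)

w = filtered_svd_solve(A, b)
print(np.linalg.norm(A @ w - b))           # residual of the stabilized solution
```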

    Approaches to Sequence Similarity Representation

    We discuss several approaches to similarity-preserving coding of symbol sequences and possible connections between their distributed versions and metric embeddings. Interpreting sequence representation methods in terms of embeddings can help develop an approach to their analysis and may lead to the discovery of useful properties.
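
    As a hypothetical example of one similarity-preserving coding of symbol sequences (a plain bigram embedding, not a method proposed in the paper), the sketch below maps each sequence to a vector of bigram counts, so that sequences sharing many bigrams end up close under cosine similarity.

```python
from collections import Counter
from math import sqrt

def bigram_embedding(seq):
    """Map a symbol sequence to a sparse vector (Counter) of bigram counts."""
    return Counter(zip(seq, seq[1:]))

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[k] * v[k] for k in u.keys() & v.keys())
    norm = sqrt(sum(x * x for x in u.values())) * sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

a = bigram_embedding("abcdefg")
b = bigram_embedding("abcxefg")   # one substitution: most bigrams survive
c = bigram_embedding("gfedcba")   # reversal: no bigram survives

print(cosine(a, b))               # relatively high similarity
print(cosine(a, c))               # zero similarity
```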

    Analogical Reasoning Techniques in Intelligent Counterterrorism Systems

    The paper develops a set of ideas and techniques that support analogical reasoning throughout the life cycle of terrorist acts. Implementing these ideas and techniques can raise the intellectual level of computer-based systems used by a wide range of personnel dealing with various aspects of the problem of terrorism and its effects. The method combines techniques of structure-sensitive distributed representations in the framework of Associative-Projective Neural Networks with knowledge obtained through progress in analogical reasoning research, in particular the Structure Mapping Theory. These analogical reasoning tools are expected to help minimize the effects of terrorist acts on the civilian population by facilitating knowledge acquisition and the formation of terrorism-related knowledge bases, as well as by supporting analysis, decision making, and reasoning with those knowledge bases for users at various levels of expertise before, during, and after terrorist acts.

    Representation and processing of structures with binary sparse distributed codes

    The schemes for compositional distributed representations include those that allow on-the-fly construction of fixed-dimensionality codevectors to encode structures of varying complexity. The similarity of such codevectors takes into account both the structural and the semantic similarity of the represented structures. In this paper we provide a comparative description of the sparse binary distributed representations developed within the framework of the Associative-Projective Neural Network architecture and the better-known Holographic Reduced Representations of Plate and Binary Spatter Codes of Kanerva. The key procedure in Associative-Projective Neural Networks is Context-Dependent Thinning, which binds codevectors and maintains their sparseness. The codevectors are stored in a structured memory array that can be realized as a distributed auto-associative memory. Examples of distributed representation of structured data are given. Fast estimation of the similarity of analogical episodes by the overlap of their codevectors is used in modeling analogical reasoning, both for retrieving analogs from memory and for analogical mapping.
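
    The fast retrieval of analogs by codevector overlap mentioned above can be sketched as follows. This is an illustration rather than the APNN implementation: the vector dimension, density, number of stored episodes, and the way the probe is made to resemble one stored episode are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
N, DENSITY = 10_000, 0.01

def episode_vector():
    """Stand-in for a sparse binary codevector encoding a structured episode."""
    return rng.random(N) < DENSITY

# Memory of stored episode codevectors (one row per episode).
memory = np.stack([episode_vector() for _ in range(100)])

# A probe episode that shares part of its structure with stored episode 42;
# here the structural/semantic similarity is mimicked by reusing about half
# of that episode's 1s.
probe = episode_vector()
probe |= memory[42] & (rng.random(N) < 0.5)

# Similarity to every stored episode is just the overlap of the binary
# codevectors; the most similar analog is the row with the largest overlap.
overlaps = (memory & probe).sum(axis=1)
print(int(np.argmax(overlaps)))    # expected: 42
```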

    A Neural Network for Segmentation of Line Drawings into Lines of Different Orientations

    A neural network is described that is designed for segmenting binary line drawings into separate line segments according to their orientations. The network consists of several neural layers; each layer serves to extract line segments of a certain orientation, and every layer is in one-to-one correspondence with the retina. The task is performed by an iterative procedure that combines interactions of neurons inside each layer through oriented excitatory connections with inhibitory interactions between the corresponding neurons of different layers. The network produces orientation features as a result of partitioning the drawing; these features are intended to be used for recognition of line drawings. Computer simulations of the network showed that it provides reasonable segmentation of individual handwritten characters.
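
    The sketch below is a loose illustration of the general idea, not the described network: a binary drawing is split into orientation layers according to local support along each orientation, and a few iterations of between-layer competition let the dominant orientation win at each pixel. The orientation set, neighborhood offsets, iteration count, and tie-breaking rule are assumptions made for the example, and wrap-around at the image borders is ignored.

```python
import numpy as np

# Four orientation layers: horizontal, vertical, and the two diagonals.
# Each orientation is defined by the grid offsets of its two neighbors.
OFFSETS = {
    "horizontal":    [(0, -1), (0, 1)],
    "vertical":      [(-1, 0), (1, 0)],
    "diagonal_down": [(-1, -1), (1, 1)],
    "diagonal_up":   [(-1, 1), (1, -1)],
}

def oriented_support(layer, offsets):
    """Count, for every pixel, how many of its oriented neighbors are active."""
    return sum(np.roll(layer, shift, axis=(0, 1)).astype(int) for shift in offsets)

def segment_by_orientation(image, iterations=5):
    """Split a binary drawing into orientation layers (illustrative sketch)."""
    layers = {name: image.copy() for name in OFFSETS}
    for _ in range(iterations):
        # "Excitation" inside each layer: support from same-orientation neighbors.
        support = {name: oriented_support(layers[name], offs)
                   for name, offs in OFFSETS.items()}
        # "Inhibition" between layers: each active pixel stays only in the
        # layer where it has the strongest oriented support.
        stacked = np.stack(list(support.values()))
        best = stacked.argmax(axis=0)
        for k, name in enumerate(OFFSETS):
            layers[name] = image & (best == k) & (stacked[k] > 0)
    return layers

# A small drawing: a horizontal stroke crossing a vertical stroke.
img = np.zeros((9, 9), dtype=bool)
img[4, 1:8] = True
img[1:8, 4] = True

for name, layer in segment_by_orientation(img).items():
    print(name, int(layer.sum()))
```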